Component SPD Matrices: A lower-dimensional discriminative data descriptor for image set classification
In the domain of pattern recognition, using SPD (Symmetric Positive
Definite) matrices to represent data and taking the metrics of the resulting
Riemannian manifold into account have been widely used for the task of image
set classification. In this paper, we propose a new data representation
framework for image sets named CSPD (Component Symmetric Positive Definite).
First, we obtain sub-image sets by dividing the image set into square blocks
of the same size, and use the traditional SPD model to describe them. Then, we
use the results of the Riemannian kernel on SPD matrices as similarities of
corresponding sub-image sets. Finally, the CSPD matrix appears in the form of
the kernel matrix over all the sub-image sets, where the entry CSPD_{i,j}
denotes the similarity between the i-th and the j-th sub-image set. Here, the
Riemannian kernel is shown to satisfy Mercer's theorem, so our proposed
CSPD matrix is symmetric and positive definite and also lies on a Riemannian
manifold. On three benchmark datasets, experimental results show that CSPD is a
lower-dimensional and more discriminative data descriptor for the task of image
set classification.
Comment: 8 pages, 5 figures, Computational Visual Media, 201
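The construction above (split an image set into sub-image sets of equal square blocks, describe each with an SPD covariance descriptor, and assemble the Riemannian kernel matrix as the CSPD descriptor) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names are hypothetical, and a log-Euclidean Gaussian kernel is assumed as the Riemannian kernel on SPD matrices.

```python
import numpy as np

def spd_log(X):
    # matrix logarithm of a symmetric positive definite matrix via eigh
    w, v = np.linalg.eigh(X)
    return (v * np.log(w)) @ v.T

def spd_from_set(vectors, eps=1e-6):
    # covariance descriptor of a sub-image set, regularized to stay SPD
    c = np.cov(vectors, rowvar=False)
    return c + eps * np.eye(c.shape[0])

def le_kernel(X, Y, sigma=1.0):
    # log-Euclidean Riemannian kernel between two SPD matrices
    d = np.linalg.norm(spd_log(X) - spd_log(Y))
    return np.exp(-d * d / (2.0 * sigma * sigma))

def cspd(image_set, block=2, sigma=1.0):
    # image_set: array (n_images, H, W); H and W divisible by `block`
    n, H, W = image_set.shape
    subsets = []
    for i in range(0, H, block):
        for j in range(0, W, block):
            # one sub-image set per block position, images flattened to vectors
            vecs = image_set[:, i:i + block, j:j + block].reshape(n, -1)
            subsets.append(spd_from_set(vecs))
    m = len(subsets)
    K = np.empty((m, m))
    for a in range(m):
        for b in range(a, m):
            K[a, b] = K[b, a] = le_kernel(subsets[a], subsets[b], sigma)
    return K  # the CSPD descriptor: symmetric positive definite kernel matrix
```

Because the Gaussian kernel on the log-Euclidean distance is a Mercer kernel, the resulting matrix is symmetric positive definite by construction, matching the property the abstract relies on.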
Identification of neprilysin as a potential target of arteannuin using computational drug repositioning
The discovery of arteannuin (qinghaosu) in the 20th century was a major advance for medicine. Besides functioning as a malaria therapy, arteannuin acts as a pharmacological agent in a range of other diseases, but its mechanism of action remains obscure. In this study, the reverse docking server PharmMapper was used to identify potential targets of arteannuin. The results were checked using the chemical-protein interactome servers DRAR-CPI and DDI-CPI, and verified by AutoDock Vina. The results showed that neprilysin (also known as CD10), a common acute lymphoblastic leukaemia antigen, was the top disease-related target of arteannuin. The chemical-protein interactome and docking results agreed with those of PharmMapper, further implicating neprilysin as a potential target. Although experimental verification is required, this study provides guidance for future pharmacological investigations into novel clinical applications for arteannuin.
Riemannian kernel based Nyström method for approximate infinite-dimensional covariance descriptors with application to image set classification
In the domain of pattern recognition, using the CovDs (Covariance
Descriptors) to represent data and taking the metrics of the resulting
Riemannian manifold into account have been widely adopted for the task of image
set classification. Recently, it has been proven that infinite-dimensional
CovDs are more discriminative than their low-dimensional counterparts. However,
the form of infinite-dimensional CovDs is implicit and the computational load
is high. We propose a novel framework for representing image sets by
approximating infinite-dimensional CovDs in the paradigm of the Nyström
method based on a Riemannian kernel. We start by modeling the images via CovDs,
which lie on the Riemannian manifold spanned by SPD (Symmetric Positive
Definite) matrices. We then extend the Nyström method to the SPD manifold and
obtain the approximations of CovDs in RKHS (Reproducing Kernel Hilbert Space).
Finally, we approximate infinite-dimensional CovDs via these approximations.
Empirically, we apply our framework to the task of image set classification.
The experimental results obtained on three benchmark datasets show that our
proposed approximate infinite-dimensional CovDs outperform the original CovDs.
Comment: 6 pages, 3 figures, International Conference on Pattern Recognition
201
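The Nyström step described above (compute the Riemannian kernel between all descriptors and a small set of landmark SPD matrices, then map each descriptor to a finite-dimensional feature whose inner products approximate the kernel) can be sketched as below. This is a hedged illustration under assumptions: the function names are hypothetical, and the log-Euclidean Gaussian kernel stands in for the paper's Riemannian kernel.

```python
import numpy as np

def spd_log(X):
    # matrix logarithm of an SPD matrix via eigendecomposition
    w, v = np.linalg.eigh(X)
    return (v * np.log(w)) @ v.T

def le_kernel(X, Y, sigma=1.0):
    # log-Euclidean Riemannian kernel between two SPD matrices
    d = np.linalg.norm(spd_log(X) - spd_log(Y))
    return np.exp(-d * d / (2.0 * sigma * sigma))

def nystrom_features(spds, landmarks, sigma=1.0):
    # C: kernel between every descriptor and each landmark
    C = np.array([[le_kernel(X, L, sigma) for L in landmarks] for X in spds])
    # W: kernel Gram matrix among the landmarks themselves
    W = np.array([[le_kernel(A, B, sigma) for B in landmarks] for A in landmarks])
    # W^{-1/2} via eigendecomposition, clipping tiny eigenvalues for stability
    w, v = np.linalg.eigh(W)
    w = np.clip(w, 1e-10, None)
    W_inv_sqrt = (v / np.sqrt(w)) @ v.T
    # rows are finite-dimensional features: (C W^{-1/2})(C W^{-1/2})^T ~ K
    return C @ W_inv_sqrt
```

The design choice here is the standard Nyström trade-off: with m landmarks the cost is O(N m) kernel evaluations instead of O(N^2), and the feature Gram matrix recovers the full kernel exactly when the landmarks span the data.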
Approximation of Nonlinear Functionals Using Deep ReLU Networks
In recent years, functional neural networks have been proposed and studied in
order to approximate nonlinear continuous functionals defined on L^p([-1, 1]^s) for integers s >= 1 and 1 <= p < infinity. However, their theoretical
properties are largely unknown beyond the universality of approximation, or the
existing analysis does not apply to the rectified linear unit (ReLU) activation
function. To fill this void, we investigate here the approximation power of
functional deep neural networks associated with the ReLU activation function by
constructing a continuous piecewise linear interpolation under a simple
triangulation. In addition, we establish rates of approximation of the proposed
functional deep ReLU networks under mild regularity conditions. Finally, our
study may also shed some light on the understanding of functional data learning
algorithms.
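The key construction mentioned above, a continuous piecewise linear interpolation realized by a ReLU network, can be illustrated in one dimension: any piecewise linear interpolant of points (t_i, y_i) is exactly a one-hidden-layer ReLU network f(x) = y_0 + sum_i c_i * relu(x - t_i), where c_0 is the first slope and each later c_i is the change in slope at a breakpoint. This is a minimal sketch of that textbook identity, not the paper's functional-network construction; the function names are hypothetical.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def pwl_as_relu_net(ts, ys):
    # One-hidden-layer ReLU network realizing the continuous piecewise
    # linear interpolant of the points (ts[i], ys[i]) on [ts[0], ts[-1]]:
    #   f(x) = ys[0] + sum_i c_i * relu(x - ts[i])
    ts, ys = np.asarray(ts, float), np.asarray(ys, float)
    slopes = np.diff(ys) / np.diff(ts)
    # c_0 = first slope; c_i (i >= 1) = slope change at breakpoint ts[i]
    c = np.concatenate([slopes[:1], np.diff(slopes)])
    def f(x):
        return ys[0] + sum(ci * relu(x - ti) for ci, ti in zip(c, ts[:-1]))
    return f
```

On the interval [ts[0], ts[-1]] the network agrees with linear interpolation of the samples; the paper's contribution is extending this kind of triangulation-based interpolation to functional inputs and quantifying the resulting approximation rates.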